Overview
What is Mirantis Kubernetes Engine?
The Mirantis Kubernetes Engine (formerly Docker Enterprise, acquired by Mirantis in November 2019) aims to let users ship code faster. Mirantis Kubernetes Engine gives users one set of APIs and tools to deploy, manage, and observe secure-by-default, certified, batteries-included Kubernetes clusters…
Need insights on Mirantis Cloud Native Suite
With Docker consider it done
Superb tool for simplifying your IT problems and overcoming technical issues
Save space and time!
Docker Swarm: Simplify Multiple Deployments
Docker: If you don't bring up and take down a few tens of dockers a day, then you really need to start doing so
Productivity Booster
Docker: The Little "VM" That Could!
Linux everywhere!
Docker looks like the future of containers for development.
Docker for all your container needs!
Docker can decouple VM provisioning and deploy.
Very useful for testing purposes
Docker is containers within containers
Docker simplifies our custom build and deployment process
Pricing
Free: $0.00
Basic: $500.00
Entry-level setup fee?
- No setup fee
Offerings
- Free Trial
- Free/Freemium Version
- Premium Consulting/Integration Services
Starting price (does not include setup fee)
- $500 per year per node
Product Details
- About
- Tech Details
- FAQs
Mirantis Kubernetes Engine Technical Details
Deployment Types | Software as a Service (SaaS), Cloud, or Web-Based
---|---
Operating Systems | Unspecified
Mobile Application | No
Frequently Asked Questions
Comparisons
Reviews and Ratings (210)
Community Insights
- Business Problems Solved
- Recommendations
Docker has proven to be a versatile tool with a wide range of use cases. Users have found that Docker simplifies the packaging and deployment of applications and services, allowing developers to match their development environment to production and eliminate cross-cutting software dependencies. It has been utilized as the backbone of a hosted app infrastructure, where every element is broken down into microservices deployed on the AWS cloud. Additionally, Docker has been instrumental in creating specialized microservices such as a Selenium Grid for automated web-based testing.
Moreover, Docker has played a crucial role in maintaining environmental consistency and streamlining deployment processes. It has enabled users to swiftly containerize Continuous Deployment and Integration pipelines, facilitating easy deployment and updates of the system and its environments. With Docker, users have been able to quickly deploy and monitor servers, firewalls, switches, and other components, providing a consistent and efficient environment for prototyping and testing. Another notable use case is spinning up new databases for microservices using Docker, ensuring consistency and independence across different environments.
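The database-per-microservice pattern described above can be sketched with a minimal docker-compose file. This is a hypothetical illustration, not taken from any review; the service names, image tag, port choices, and credentials are all assumptions:

```yaml
# Hypothetical docker-compose.yml: one throwaway Postgres instance per microservice,
# giving each service a consistent, independent database across environments.
version: "3.8"
services:
  orders-db:
    image: postgres:15            # pinned tag keeps environments identical
    environment:
      POSTGRES_DB: orders
      POSTGRES_USER: orders
      POSTGRES_PASSWORD: example  # illustrative only; use secrets in real setups
    ports:
      - "5433:5432"               # distinct host port so databases don't clash
  billing-db:
    image: postgres:15
    environment:
      POSTGRES_DB: billing
      POSTGRES_USER: billing
      POSTGRES_PASSWORD: example
    ports:
      - "5434:5432"
```

Running `docker-compose up -d orders-db` brings up just one database; tearing it down and recreating it gives a clean, consistent state for the next test run.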
Furthermore, Docker has integrated seamlessly with orchestration frameworks like Apache Mesos and Mesosphere Marathon. This combination has allowed for more efficient application development and deployment through effective management of containers. Docker has also demonstrated its utility in building server deployment files and running tests, enabling consistent deployments and reliable testing procedures.
In addition to these technical applications, Docker has proved to be valuable in hosting MySQL databases for production websites. Its stability, security features, and easy provisioning of identical instances have made it a preferred choice for users. Moreover, Docker has been extensively used in CI builds as it enables the creation of custom Linux images and seamless deployment of the latest code from the Docker registry.
The flexibility offered by Docker comes to the forefront when it comes to testing practices. It provides a highly configurable environment that makes cross-platform testing significantly more efficient. Users have leveraged Docker for both automated website/application testing pipelines as well as creating flexible environments for manual testing. Moreover, Docker has acted as a viable alternative to custom build and deploy solutions, offering a more flexible and decentralized process.
Notably, Docker has been embraced by a large global financial services provider to enhance efficiency and agility in application development. This adoption has resulted in increased innovation and productivity within the organization. Another significant benefit of using Docker is its ability to provide identical application environments across multiple deployment environments, leading to the deployment of more stable applications.
Furthermore, Docker has played a role in differentiating between server/compute infrastructure and application infrastructure. Operations teams can efficiently manage the cluster of servers, while application developers can run containers on the cluster, ensuring a clear separation and easier management of the two layers.
Teams have leveraged Docker for various development and deployment practices. Engineers can build applications in the same environment, eliminating local configuration issues that often arise when working across different setups. Docker has been particularly useful for WordPress development, replacing tools like Vagrant and providing tighter integration with Windows Hyper-V and better performance.
One of the significant advantages of Docker is its ability to containerize applications, resulting in consistent deployment environments across different stages and compatibility with various cloud platforms. This has greatly simplified the deployment process for users and enhanced their productivity. Additionally, Docker has been highly beneficial for the development team in resolving issues related to different setups on Windows, Linux, and Mac operating systems, while also providing easy configurations for automation QA.
Docker's impact extends beyond software development into the realm of research reproducibility. Users have developed Docker containers to encapsulate research pipelines, leveraging GitHub and DockerHub as public repositories. This approach has effectively addressed the challenge of ensuring reproducibility in research experiments.
Moreover, Docker Swarm has been employed to deploy internal applications in a managed cluster, successfully tackling scaling and load balancing issues during peak business hours. The combination of Docker with Kubernetes has also gained popularity among teams for containerizing projects and facilitating the development of microservices.
Overall, Docker's value proposition lies in its ability to provide consistent development environments, prevent deployment issues, streamline configurations, enhance testing efficiency, and simplify the overall software packaging and deployment processes. Its widespread usage across various industries highlights its robustness, ease of setup, community support through open-sourced images, and its ability to create and test configurations as needed. Docker has become an indispensable tool for many organizations seeking to optimize their software development lifecycle while improving productivity and innovation.
Based on the reviews, here are the three most common recommendations:
- Users recommend trying Docker for deploying web services and running micro-services. They suggest doing tutorials to learn how to create Dockerfiles and docker-compose files correctly. Additionally, they advise considering whether Docker is necessary or if statically linked binaries can be used instead.
- Users also recommend using Docker for QA environments and setting up developers with the environment they need. They find Docker to be an easy-to-use development tool with great rewards for a small amount of effort. However, some users caution that while Docker is a good solution, there may be better alternatives available.
- Another common recommendation is to carefully consider the use of Docker in a workflow and discuss its usability within the organization before implementing it. Users emphasize the importance of learning the basics of Docker and understanding if continuous integration/deployment is the right approach. They also mention that Docker has a supportive community and is widely used in the industry.
Overall, users suggest experimenting with Docker, especially for new applications or running micro-services. They recommend taking advantage of Docker's simplicity and portability while being mindful of specific requirements and considering other options if needed.
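As a starting point for the Dockerfile tutorials recommended above, a working Dockerfile can be just a few lines. This is a hypothetical sketch; the base image, file names, and entry point are illustrative assumptions:

```dockerfile
# Hypothetical Dockerfile for a small Python web service.
FROM python:3.12-slim                # small, pinned base image
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["python", "app.py"]             # app.py is an illustrative entry point
```

A typical workflow would then be `docker build -t myapp .` followed by `docker run -p 8000:8000 myapp`, which is the kind of repeatable build-and-run loop the recommendations above point to.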
Attribute Ratings
Reviews
(1-11 of 11)
Need insights on Mirantis Cloud Native Suite
- Scale small components individually
- Scale independent native components
- Creating docker images
- Support for high-level components needs to be improved
- Skewed scaling of components can be improved
- Offline support
With Docker consider it done
- Easy to control.
- Setting up network across different containers is quite easy.
- Mapping of resources with host machine is easy.
- Setting up networking from scratch is painful.
- Resources required for setting up Docker Enterprise are huge.
- User interface needs to be improved and made more user friendly.
It is quite expensive when it comes to pricing, and almost all of the features can be had with the Community Edition, which is free.
- It is an advanced tool for balancing loads and managing routes.
- It is easy to edit container contents.
- Alerts are very useful and help us manage the entire network.
- I am mostly satisfied with all of its features, but I have faced issues with continuous data storage; they do offer features like Docker Data Volumes, but there is still much room for improvement.
- Moreover, I am a happy user of this platform.
Docker: The Little "VM" That Could!
- Usability is great after the initial setup.
- Installation is a breeze.
- The ability to knock down a container and rebuild it from scratch is fantastic.
- It would be nice if Docker had its own frontend GUI.
- The CLI is very difficult unless you have a decent amount of Linux experience.
- Stacks are still a mystery to me.
Very useful for testing purposes
I should say I know Docker is meant for something more professional and I'm a light user; I don't push a Docker image all the way to a server, but for testing purposes it has been extremely useful. You can use the CLI to change things; you can create different databases, alter them, load them again, and so on.
- Creating and deleting "server" images is way easier than normal. You can change configurations, and it basically creates a virtual machine on your computer, but far more easily than using VMware yourself. It's a layer on top of that.
- Getting images is pretty easy; there are many on the internet, and you can get help from the community in cases where you are not sure what to do.
- The commands in Docker work pretty well. There is good documentation, and you can achieve almost anything you could with a virtual machine.
- Maintaining stability between environments thanks to the Docker app: you can have the SAME exact app on different systems (macOS vs. Windows) and it will behave 99% the same.
- As a NON-heavy user, it's definitely a bit intimidating in the onboarding phase. It's hard to understand what everything is for and how to use it appropriately. As I wrote before, this could be because I'm not a hardcore developer myself.
- At least on Windows 10, I always have problems turning it on. It has problems starting; I need to quit and start it again, and then it works. I'm supposed to have a stable version, so I'm not sure if it's just me.
If you, like me, know something about developing but very little about Linux and distributions, be ready to test a lot of things and have a hard time achieving what you want. That's not Docker's fault, it's because it's meant for other users who are more "experts" in that field.
Docker is containers within containers
- Setting up Docker containers helps developers replicate the production environment from their local machine in a virtual box. This keeps development and debugging simple.
- Portability is really helpful. You can easily shift from AWS to GCP within minutes.
- Docker images are version-controlled, much like GitHub commits.
- User friendliness: creating the virtual environment takes much longer than running a shell script to set up the environment.
- Docker containers are for running applications, not for data containers. Having that feature would be awesome.
- A prune command for images and containers to force-delete everything as a cleanup step would be helpful.
Docker for Quick and Easy Container Deployments
- Container environment consistency
- Lightweight deployments
- Cross-platform
- Hyper-V can cause problems for configuration on Windows environments
Docker for QA: dockerized Selenium Grid
For me as an automation QA lead, it's mainly used for our Selenium Grid. Our grid runs on AWS, and I configured it via Docker. I use docker-compose to start it up and to scale how many browsers should be started. Using Docker alone was already a huge help, as we didn't really have to worry about configuration and it was easy to reuse the same setup for more instances, but combined with the scaling option of docker-compose it proved to be really convenient.
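A dockerized Selenium Grid of the kind described can be sketched with docker-compose. This is a hypothetical sketch, not the reviewer's actual configuration; the image tags and port mappings are assumptions based on the standard Selenium images:

```yaml
# Hypothetical docker-compose.yml for a Selenium Grid: one hub plus scalable Chrome nodes.
version: "3.8"
services:
  selenium-hub:
    image: selenium/hub:4.21
    ports:
      - "4442:4442"   # event bus publish
      - "4443:4443"   # event bus subscribe
      - "4444:4444"   # tests point their RemoteWebDriver at this port
  chrome:
    image: selenium/node-chrome:4.21
    depends_on:
      - selenium-hub
    environment:
      SE_EVENT_BUS_HOST: selenium-hub
      SE_EVENT_BUS_PUBLISH_PORT: 4442
      SE_EVENT_BUS_SUBSCRIBE_PORT: 4443
```

With this layout, `docker-compose up -d --scale chrome=5` would start the hub with five browser nodes, matching the scaling workflow the review describes.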
- Develop on multiple platforms. The same Docker image can be used on Linux/Mac/Windows.
- Ease of configuration. It's very easy to create a base image for your project. There are a lot of already existing images you can use to start with.
- Scalability. If you need more than just one instance of the same image, it's just a command to spin up more.
- Finding the perfect configuration: it's very easy to find some basic configurations, but fine-tuning it can be challenging.
- Understanding the concept can be difficult at first. Most of the questions I get from colleagues are along the lines of: what's happening inside the container, how can we see the logs of what happens inside, etc. Once you have the concepts, these are easy to do, but it can be a rough beginning.
- It is sometimes difficult to set up; I mainly hear about this from colleagues using Windows.
Docker, Pros, Cons, Use Cases
- Docker brings in an API for container management, an image format and a possibility to use a remote registry for sharing containers. This scheme benefits both developers and system administrators.
- Docker allows for portability across machines. The application and all its dependencies can be bundled into a single container that is independent of the host version of the Linux kernel, platform distribution, or deployment model. This container can be transferred to another machine that runs Docker and executed there without compatibility issues.
- Docker has a lightweight footprint and minimal overhead. Docker images are typically very small, which facilitates rapid delivery and reduces the time to deploy new application containers.
- Docker allows for sharing. You can use a remote repository to share your container with others.
- Docker provides great version control and component reuse. You can track successive versions of a container, inspect differences, or roll-back to previous versions. Containers reuse components from the preceding layers, which makes them noticeably lightweight.
- Docker has got into the bad habit of wrapping open source Linux technologies and promoting them in a way that makes it feel like Docker invented them. They did it to LXC and they are doing it to aufs and overlayfs.
- Docker is not very developer friendly.
- Docker containers are currently for software, not for data.
- New Docker versions cause breakage. You get all kinds of subtle regressions between Docker versions; it's constantly breaking unpredictable stuff in unexpected ways.
- Docker does not have a command to clean up older images, and lifecycle management is lacking.
- Lack of kernel support.
Simple Up and Running Script Based Containerization
- Its topology isolation is, in my opinion, an unbeatable feature. In our systems we need parallel Java 7 and Java 8 versions running together; without Docker, that would not have been possible.
- Docker Swarm, which takes care of the load balancing our systems depend on, is a must-have.
- Docker Compose is a very powerful feature: I can have my containers scripted, with the continuous integration and deployment of each kept separate and its concerns isolated, while all are bootstrapped together under the same "docker-compose up" command.
- Some commands are not very intuitive. Getting an entire swarm properly functioning [specifically for the scenario we have at our company] wasn't a simple task, as a very wide range of environment variables had to be safely and tidily maintained. The pipeline to bring up such a topology wasn't simple to figure out.
- Some volumes, if not properly shut down when necessary, will eat up all your disk space. The extra -v attribute wasn't obvious to use when removing a specific volume, which led to a huge headache.
- Some containers, though exposed as official ones on Docker Hub, consume a great deal of space and memory. We had to build our own containers for pretty much everything, even though the services we needed in them were pretty vanilla.
Docker for research reproducibility
- Lightweight and portable.
- Easy to share (either by Docker file or as a container/DockerHub).
- Same environment regardless of the user's operating system.
- Docker is mainly a command-line tool; delivering a graphical user interface out of a container is still a problem.
- When Docker runs within a VM, as it does for Mac and Windows users, transferring files in and out of Docker is challenging.
- Since Docker runs within a VM for Mac and Windows users, there is extensive overhead that needs careful consideration.
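The reproducibility workflow this review describes usually comes down to pinning every version in a Dockerfile and publishing it alongside the code on GitHub and DockerHub. A hypothetical sketch follows; the base image tag, package versions, directory layout, and entry point are all illustrative assumptions:

```dockerfile
# Hypothetical Dockerfile encapsulating a research pipeline for reproducibility.
FROM python:3.12.4-slim              # exact tag, so future builds see the same base
RUN pip install --no-cache-dir \
        numpy==1.26.4 \
        pandas==2.2.2                # pinned versions instead of floating "latest"
WORKDIR /work
COPY pipeline/ ./pipeline/           # analysis code tracked in the same Git repo
CMD ["python", "pipeline/run.py"]    # illustrative entry point for the pipeline
```

Because everything the pipeline needs is named explicitly, anyone who pulls the published image (or rebuilds from the Dockerfile) runs the experiment in the same environment, which is the reproducibility guarantee described above.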